
    A cost model for managing producer and consumer risk in availability demonstration testing.

    Get PDF
    Evaluation and demonstration of system performance against specified requirements is an essential element of risk reduction during the design, development, and production phases of a product lifecycle. Typical demonstration testing focuses on reliability and maintainability without consideration for availability. One practical reason is that demonstration testing for availability cannot be performed until very late in the product lifecycle, when production-representative units become available and system integration is completed. At this point, the requirement to field the system often takes priority over demonstration of availability performance. Without proper validation testing, the system can be fielded with reduced mission readiness and increased lifecycle cost. The need exists for availability demonstration testing (ADT) with emphasis on managing risk while minimizing the cost to the user. Risk management must ensure a test strategy that adequately considers producer and consumer risk objectives. This research proposes a methodology for ADT that gives managers and decision makers an improved ability to distinguish between high- and low-availability systems. A new availability demonstration test methodology is defined that provides a useful strategy for the consumer to mitigate significant risk without sacrificing cost or time to field a product or capability. A surface navy electronic system case study supports the practical implementation of this methodology using no more than a simple spreadsheet tool for numerical analysis. Development of this method required three significant components which add to the existing body of knowledge. The first was a comparative performance assessment of existing ADT strategies to understand whether any preferences exist. The next was the development of an approach for ADT design that effectively considers time constraints on the test duration. 
The third was the development of a procedure for ADT design that provides awareness of risk levels in time-constrained ADT and offers an evaluation of alternatives to select the best sub-optimal test plan. Comparison of the different ADT strategies utilized a simulation model to evaluate runs specified by a five-factor, full-factorial design of experiments. Analysis of variance verified that ADT strategies differ significantly with respect to the output responses of decision quality and timeliness. Analysis revealed that the fixed-number-of-failures ADT strategy has the lowest deviation from estimated producer and consumer risk, the measure of quality. The sequential ADT strategy had an average error 3.5 times larger, and fixed-test-time strategies displayed error rates 8.5 to 12.7 times larger than the best. The fixed-test-time strategies had superior performance in timeliness, measured by average test duration; the sequential strategy took 24% longer on average, and the fixed-number-of-failures strategy took 2.5 times longer on average than the best. The research evaluated the application of a time constraint on ADT and determined that producer and consumer risk levels increase when test duration is limited from its optimal value. It also revealed that substitution of a specified time constraint, formatted for a specific test strategy, produced a pair of dependent relationships between risk levels and the critical test value. These relationships define alternative test plans and can be analyzed in a cost context to compare and select the low-cost alternative test plan. This result led to the specification of a support tool that enables a decision maker to understand changes to α and β resulting from constraint of test duration, and to make decisions based on the true risk exposure. The output of this process is a time-constrained test plan with known producer and consumer risk levels.
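    As a rough illustration of the producer/consumer risk trade-off described above, the sketch below estimates α and β by Monte Carlo for a fixed-number-of-failures test with exponential up/down times. All parameter values (MTBF, MTTR, number of failures, critical value) are illustrative assumptions, not figures from the study.

```python
import random

def p_accept(mtbf, mttr, r_failures, a_crit, n_trials=20000, seed=1):
    """Monte Carlo probability that a fixed-number-of-failures availability
    demonstration test accepts: run until r failures have occurred, then
    accept if observed availability (uptime / total time) >= a_crit."""
    rng = random.Random(seed)
    accepts = 0
    for _ in range(n_trials):
        up = sum(rng.expovariate(1.0 / mtbf) for _ in range(r_failures))
        down = sum(rng.expovariate(1.0 / mttr) for _ in range(r_failures))
        if up / (up + down) >= a_crit:
            accepts += 1
    return accepts / n_trials

# Illustrative design points: producer's availability A0 = 0.95,
# consumer's (rejectable) availability A1 = 0.85, critical value 0.90.
alpha = 1 - p_accept(mtbf=95.0, mttr=5.0, r_failures=20, a_crit=0.90)  # producer risk
beta = p_accept(mtbf=85.0, mttr=15.0, r_failures=20, a_crit=0.90)      # consumer risk
```

    Sweeping the critical value or the number of failures in such a loop is the kind of analysis a simple spreadsheet tool could equally support.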

    REACTIVE STRENGTH INDEX-MODIFIED IN DIFFERENT PLYOMETRIC TASKS

    Get PDF
    The Reactive Strength Index-Modified (RSImod) is a reliable method of measuring the explosiveness of an athlete during a range of plyometric exercises. The purpose of the current study was to measure the between-limb differences in RSImod across three different plyometric tasks. Eleven recreationally active participants performed countermovement jumps, stop jumps and single-leg jumps. The study found no significant differences in RSImod between dominant and non-dominant limbs across all three tasks (p>0.05), but did find RSImod to be higher in the stop jump than in a countermovement jump and single-leg stop jump for both dominant and non-dominant limbs. These findings show RSImod may not be an indicator of limb asymmetry, but may be useful for the coach when looking to develop explosive performance in an athlete or performer.
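    RSImod is conventionally computed as jump height (estimated from flight time) divided by time to take-off; the snippet below sketches that calculation. The example values are hypothetical, not data from the study.

```python
def rsi_mod(flight_time_s, contraction_time_s, g=9.81):
    """RSImod = jump height / time to take-off, with jump height
    estimated from flight time as g * t_flight**2 / 8."""
    jump_height_m = g * flight_time_s ** 2 / 8.0
    return jump_height_m / contraction_time_s

print(round(rsi_mod(0.50, 0.75), 3))  # 0.409 for a 0.50 s flight, 0.75 s contraction
```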

    Atmospheric Effects of Energetic Particle Precipitation in the Arctic Winter 1978-1979 Revisited

    Get PDF
    The Limb Infrared Monitor of the Stratosphere (LIMS) measured polar stratospheric enhancements of NO2 mixing ratios due to energetic particle precipitation (EPP) in the Arctic winter of 1978–1979. Recently reprocessed LIMS data are compared to more recent measurements from the Michelson Interferometer for Passive Atmospheric Sounding (MIPAS) and the Atmospheric Chemistry Experiment Fourier transform spectrometer (ACE-FTS) to place the LIMS measurements in the context of current observations. The amount of NOx (NO + NO2) entering the stratosphere that has been created by EPP in the mesosphere and lower thermosphere (EPP-NOx) has been quantified for the 1978–1979 and 2002–2003 through 2008–2009 Arctic winters. The NO2 enhancements in the LIMS data are similar to those in MIPAS and ACE-FTS data in the Arctic winters of 2002–2003, 2004–2005, 2006–2007, and 2007–2008. The largest enhancement by far is in 2003–2004 (∼2.2 Gmol at 1500 K), which is attributed to a combination of elevated EPP and unusual dynamics that led to strong descent in the upper stratosphere/lower mesosphere in late winter. The enhancements in 2005–2006 and 2008–2009, during which large stratospheric NOx enhancements were caused by a dynamical situation similar to that in 2003–2004, are larger than in all the other years (except 2003–2004) at 3000 K. However, by 2000 K the enhancements in 2005–2006 (2008–2009) are on the same order of magnitude as (smaller than) all other years. These results highlight the importance of the timing of the descent in determining the potential of EPP-NOx for reaching the middle stratosphere.

    Marine biogeochemical responses to the North Atlantic Oscillation in a coupled climate model

    Get PDF
    In this study a coupled ocean-atmosphere model containing interactive marine biogeochemistry is used to analyze interannual, lagged, and decadal marine biogeochemical responses to the North Atlantic Oscillation (NAO), the dominant mode of North Atlantic atmospheric variability. The coupled model adequately reproduces present-day climatologies and NAO atmospheric variability. It is shown that marine biogeochemical responses to the NAO are governed by different mechanisms according to the time scale considered. On interannual time scales, local changes in vertical mixing, caused by modifications in air-sea heat, freshwater, and momentum fluxes, are most relevant in influencing phytoplankton growth through light and nutrient limitation mechanisms. At subpolar latitudes, deeper mixing occurring during positive NAO winters causes a slight decrease in late winter chlorophyll concentration due to light limitation and a 10%–20% increase in spring chlorophyll concentration due to higher nutrient availability. The lagged response of physical and biogeochemical properties to a high NAO winter shows some memory in the following 2 years. In particular, subsurface nutrient anomalies generated by local changes in mixing near the American coast are advected along the North Atlantic Current, where they are suggested to affect downstream chlorophyll concentration with 1 year lag. On decadal time scales, local and remote mechanisms act contemporaneously in shaping the decadal biogeochemical response to the NAO. The slow circulation adjustment, in response to NAO wind stress curl anomalies, causes a basin redistribution of heat, freshwater, and biogeochemical properties which, in turn, modifies the spatial structure of the subpolar chlorophyll bloom.

    Toward a Sustainable Marketplace: Expanding Options and Benefits for Consumers

    Get PDF
    While popular interest in sustainable consumption continues to grow, there is a persistent gap between consumers’ typically positive explicit attitudes towards sustainability and their actual consumption behaviours. This gap can be explained, in part, by the belief that choosing to consume sustainably is both constraining and reduces individual-level benefits. While the belief that sustainable consumption depends on making trade-offs is true in some contexts, increasingly consumers are finding that more sustainable forms of consumption can provide both an expanded set of options and additional, individual-level benefits. In this essay, we discuss and illustrate an expanded set of options and benefits across the consumption cycle: from acquisition to usage and disposition. An underlying theme is the separation of material ownership from the extraction of consumer benefits across the consumption cycle. We believe that this ongoing evolution of products - and even business models - has the potential to simultaneously increase value to consumers as well as speed progress towards a more sustainable marketplace.

    Calibration database for the Murchison Widefield Array All-Sky Virtual Observatory

    Get PDF
    We present a calibration component for the Murchison Widefield Array All-Sky Virtual Observatory (MWA ASVO) utilising a newly developed PostgreSQL database of calibration solutions. Since its inauguration in 2013, the MWA has recorded over thirty-four petabytes of data archived at the Pawsey Supercomputing Centre. According to the MWA Data Access policy, data become publicly available eighteen months after collection. Therefore, most of the archival data are now available to the public. Access to public data was provided in 2017 via the MWA ASVO interface, which allowed researchers worldwide to download MWA uncalibrated data in standard radio astronomy data formats (CASA measurement sets or UV FITS files). The addition of the MWA ASVO calibration feature opens a new, powerful avenue for researchers without a detailed knowledge of the MWA telescope and data processing to download calibrated visibility data and create images using standard radio-astronomy software packages. In order to populate the database with calibration solutions from the last six years we developed fully automated pipelines. A near-real-time pipeline has been used to process new calibration observations as soon as they are collected and upload calibration solutions to the database, which enables monitoring of the interferometric performance of the telescope. Based on this database we present an analysis of the stability of the MWA calibration solutions over long time intervals.
    Comment: 12 pages, 9 figures, Accepted for publication in PAS
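    As a loose illustration of the calibration-solutions database idea, the sketch below uses SQLite in place of PostgreSQL; the table layout, column names, and values are hypothetical, not the actual MWA ASVO schema.

```python
import sqlite3

# Hypothetical, simplified stand-in for the PostgreSQL calibration database;
# the schema below is illustrative only, not the real MWA ASVO layout.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE calibration_solution (
        obs_id      INTEGER PRIMARY KEY,  -- observation the solution applies to
        cal_obs_id  INTEGER NOT NULL,     -- calibrator observation it came from
        created_utc TEXT    NOT NULL,
        fit_quality REAL                  -- e.g. mean residual, for monitoring
    )""")
conn.execute(
    "INSERT INTO calibration_solution VALUES (?, ?, ?, ?)",
    (1096489000, 1096488800, "2014-10-04T12:00:00", 0.031))

# A user (or the ASVO service) looks up which calibration applies to an observation.
row = conn.execute(
    "SELECT cal_obs_id FROM calibration_solution WHERE obs_id = ?",
    (1096489000,)).fetchone()
```

    Keeping a quality metric per solution is what makes the long-term stability monitoring described above possible: it is a single indexed query over time.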

    Friedmann-like equations for High Energy Area of Universe

    Full text link
    In this paper, the evolution of the high energy area of the universe, through the scenario of a 5-dimensional (5D) universe, has been studied. For this purpose, we solve the Einstein equations for a 5D metric and a 5D perfect fluid to derive Friedmann-like equations. Then we obtain the evolution of the scale factor and energy density with respect to both space-like and time-like extra dimensions. We obtain novel equations for the space-like extra dimension and show that matter with zero pressure cannot exist in the bulk. Also, for a dark energy fluid and a vacuum fluid, we have both accelerated expansion and contraction in the bulk.
    Comment: 9 pages, Accepted for publication in IJTP 26 June 2012. arXiv admin note: substantial text overlap with arXiv:1202.497
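    For orientation, the textbook generalisation of the flat FRW Friedmann equations to $n$ spatial dimensions ($n = 4$ for the 5D case) takes the form below; this is the standard result, not necessarily the exact equations derived in the paper.

```latex
\begin{align}
  H^2 &= \left(\frac{\dot a}{a}\right)^2
       = \frac{16\pi G\,\rho}{n(n-1)}, \\
  \frac{\ddot a}{a} &= -\frac{8\pi G}{n(n-1)}\left[(n-2)\rho + n\,p\right], \\
  \dot\rho &+ n H\left(\rho + p\right) = 0 .
\end{align}
```

    Setting $n = 3$ recovers the familiar $H^2 = 8\pi G \rho / 3$ and $\dot\rho + 3H(\rho + p) = 0$.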

    Improved Techniques for the Surveillance of the Near Earth Space Environment with the Murchison Widefield Array

    Full text link
    In this paper we demonstrate improved techniques to extend coherent processing intervals for passive radar processing with the Murchison Widefield Array. Specifically, we apply a two-stage linear range and Doppler migration compensation by utilising Keystone Formatting and a recent dechirping method. These methods are used to further demonstrate the potential for the surveillance of space with the Murchison Widefield Array using passive radar, by detecting objects orders of magnitude smaller than in previous work. This paper also demonstrates how the linear Doppler migration methods can be extended to higher-order compensation to further increase potential processing intervals.
    Comment: Presented at the 2019 IEEE Radar Conference in Boston earlier this year
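    Keystone formatting itself is a standard range-migration compensation technique: the slow-time axis of each range-frequency bin is rescaled by fc/(fc + f) so that linear range walk becomes frequency-independent. Below is a minimal interpolation-based sketch of that rescaling, not the authors' implementation; sign and scaling conventions vary across the literature.

```python
import numpy as np

def keystone_format(data_ft, freqs, fc, t_slow):
    """First-order Keystone formatting: resample the slow-time axis of each
    range-frequency bin at instants scaled by fc / (fc + f), aligning linear
    range migration (from a constant range-rate) across the band.

    data_ft : complex array, shape (n_freq, n_slow) -- range-frequency x slow-time
    freqs   : baseband frequency of each range-frequency bin (Hz)
    fc      : carrier frequency (Hz)
    t_slow  : slow-time sample instants (s), uniformly spaced
    """
    out = np.empty_like(data_ft)
    for i, f in enumerate(freqs):
        t_scaled = fc / (fc + f) * t_slow  # rescaled sample instants
        # linear interpolation of real and imaginary parts separately
        out[i] = (np.interp(t_scaled, t_slow, data_ft[i].real)
                  + 1j * np.interp(t_scaled, t_slow, data_ft[i].imag))
    return out
```

    After this step, a standard slow-time FFT per (interpolated) range bin yields a range-Doppler map in which a constant-range-rate target stays in one range cell for the full coherent processing interval.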